A note on privacy preserving iteratively reweighted least squares
Authors
Abstract
Iteratively reweighted least squares (IRLS) is a widely used method in machine learning for estimating the parameters of generalised linear models. In particular, IRLS for L1 minimisation under the linear model has a closed-form solution in each step: a simple multiplication between the inverse of the weighted second moment matrix and the weighted first moment vector. When dealing with privacy-sensitive data, however, developing a privacy preserving IRLS algorithm faces two challenges. First, because of the inversion of the second moment matrix, the usual sensitivity analysis in differential privacy, which considers the perturbation of a single datapoint, becomes complicated and often requires unrealistic assumptions. Second, because of its iterative nature, a significant cumulative privacy loss occurs, yet adding a high level of noise to compensate for this loss prevents accurate estimation. Here, we develop a practical algorithm that overcomes these challenges and outputs privatised and accurate IRLS solutions. In our method, we analyse the sensitivity of each moment separately and treat the matrix inversion and multiplication as a post-processing step, which simplifies the sensitivity analysis. Furthermore, we adopt the concentrated differential privacy formalism, a relaxed version of differential privacy that requires significantly less noise for the same level of privacy guarantee than the conventional and advanced compositions of differentially private mechanisms.
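To make the moment-perturbation idea concrete, the sketch below shows one possible reading of this approach. It is an illustration, not the authors' exact algorithm: the clipping scheme, the Huberised L1 weights, the per-iteration budget split, and all names (private_irls_l1, clip, rho_total) are assumptions introduced here. What it does illustrate is that the two weighted moments are privatised separately with Gaussian noise calibrated under zero-concentrated differential privacy (zCDP), while the inversion and multiplication remain post-processing.

```python
# A minimal sketch of the moment-perturbation idea described in the abstract,
# NOT the authors' exact algorithm: the clipping scheme, weights, and parameter
# names are illustrative assumptions.
import numpy as np

def private_irls_l1(X, y, rho_total, iters=10, clip=1.0, eps=1e-6, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape
    # Rescale each row so that ||(x_i, y_i)|| <= clip, bounding any single
    # datapoint's contribution to the moments.
    scale = np.maximum(np.linalg.norm(np.c_[X, y], axis=1) / clip, 1.0)
    Xc, yc = X / scale[:, None], y / scale
    # zCDP composes additively: split the budget over two moments per iteration.
    rho_step = rho_total / (2 * iters)
    sigma = clip**2 / np.sqrt(2 * rho_step)   # Gaussian mechanism, sensitivity ~ clip^2
    theta = np.zeros(d)
    for _ in range(iters):
        r = yc - Xc @ theta
        w = 1.0 / np.maximum(np.abs(r), eps)  # IRLS weights for L1 minimisation
        w = w / w.max()                       # uniform rescaling; keeps weights in (0, 1]
        Xw = Xc * w[:, None]
        M2 = Xc.T @ Xw                        # weighted second moment  X^T W X
        m1 = Xw.T @ yc                        # weighted first moment   X^T W y
        N = rng.standard_normal((d, d)) * sigma
        M2_noisy = M2 + (N + N.T) / np.sqrt(2)   # symmetrised noise; off-diagonals keep std sigma
        m1_noisy = m1 + rng.standard_normal(d) * sigma
        # Post-processing: inverting and multiplying the noisy moments costs no extra privacy.
        theta = np.linalg.solve(M2_noisy + eps * np.eye(d), m1_noisy)
    return theta
```

Because a uniform rescaling of the weights leaves the weighted least-squares solution unchanged, normalising w only bounds each row's contribution; the noise scale here is a rough calibration and would need to match the paper's own sensitivity analysis in a faithful implementation.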
Similar resources
Feature Preserving Sketching of Volume Data
In this paper, we present a novel method for extracting feature lines from volume data sets. This leads to a reduction of visual complexity and provides an abstraction of the original data to important structural features. We employ a new iteratively reweighted least-squares approach that allows us to detect sharp creases and to preserve important features such as corners or intersection of fea...
Non-Negative Matrix Factorisation of Compressively Sampled Non-Negative Signals
The new emerging theory of Compressive Sampling has demonstrated that by exploiting the structure of a signal, it is possible to sample a signal below the Nyquist rate and achieve perfect reconstruction. In this short note, we employ Non-negative Matrix Factorisation in the context of Compressive Sampling and propose two NMF algorithms for signal recovery—one of which utilises Iteratively Rewei...
On the Properties of Preconditioners for Robust Linear Regression
In this paper, we consider solving the robust linear regression problem y = Ax + ε by Newton's method and the iteratively reweighted least squares method. We show that each of these methods can be combined with the preconditioned conjugate gradient least squares algorithm to solve large, sparse, rectangular systems of linear algebraic equations efficiently. We consider the constant preconditioner AᵀA ...
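Although that abstract is truncated, the setup it describes can be illustrated with a small dense sketch. The weight function (Huber), problem sizes, and helper names below are assumptions; the point is only that each IRLS subproblem (AᵀWA)x = AᵀWy is solved by conjugate gradients preconditioned with the constant matrix AᵀA, factorised once and reused.

```python
# A minimal dense sketch (assumed Huber weights; not the paper's implementation)
# of IRLS for robust regression y = Ax + e, with each weighted normal-equation
# system solved by CG preconditioned with the constant matrix A^T A.
import numpy as np
from scipy.linalg import cho_factor, cho_solve
from scipy.sparse.linalg import cg, LinearOperator

def robust_irls(A, y, delta=1.0, iters=20):
    n, d = A.shape
    chol = cho_factor(A.T @ A)                  # constant preconditioner A^T A
    M = LinearOperator((d, d), matvec=lambda v: cho_solve(chol, v))
    x = cho_solve(chol, A.T @ y)                # ordinary least-squares start
    for _ in range(iters):
        r = y - A @ x
        w = delta / np.maximum(np.abs(r), delta)   # Huber weights in (0, 1]
        Aw = A * w[:, None]
        x, _ = cg(A.T @ Aw, Aw.T @ y, x0=x, M=M)   # preconditioned CG on the normal equations
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 5))
y = A @ np.arange(1.0, 6.0) + 0.1 * rng.standard_normal(200)
y[:10] += 20.0                                   # a few gross outliers
print(robust_irls(A, y))                         # close to [1, 2, 3, 4, 5]
```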
A Parallel Min-Cut Algorithm using Iteratively Reweighted Least Squares
We present a parallel algorithm for the undirected s-t min-cut problem with floating-point edge weights. Our overarching algorithm uses an iteratively reweighted least squares framework. This generates a sequence of Laplacian linear systems, which we solve using parallel matrix algorithms. Our overall implementation is up to 30 times faster than a serial solver when using 128 cores.
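For intuition about that IRLS framing, the following is a small serial sketch (illustrative only, not the paper's parallel implementation; the toy graph and names are assumptions): the objective sum_e w_e |x_u - x_v| with x_s = 1 and x_t = 0 is minimised by repeatedly solving a reweighted graph Laplacian system on the interior nodes.

```python
# A small serial sketch of the IRLS view of undirected s-t min-cut.
import numpy as np

def min_cut_irls(edges, weights, n, s, t, iters=50, eps=1e-8):
    x = np.zeros(n)
    x[s] = 1.0                                   # boundary conditions x_s = 1, x_t = 0
    interior = [i for i in range(n) if i not in (s, t)]
    for _ in range(iters):
        # Reweighting: w_e / max(|x_u - x_v|, eps) majorises w_e * |x_u - x_v|.
        d = [w / max(abs(x[u] - x[v]), eps) for (u, v), w in zip(edges, weights)]
        # Assemble the weighted graph Laplacian L.
        L = np.zeros((n, n))
        for (u, v), de in zip(edges, d):
            L[u, u] += de; L[v, v] += de
            L[u, v] -= de; L[v, u] -= de
        # Solve the Laplacian system on the interior nodes with the boundary fixed.
        A = L[np.ix_(interior, interior)]
        b = -L[np.ix_(interior, [s])].ravel()    # contribution from x_s = 1 (x_t = 0 adds nothing)
        x[interior] = np.linalg.solve(A, b)
    return x                                     # threshold x (e.g. at 0.5) to read off a cut

# Toy path graph 0-1-2 with s = 0, t = 2: the lighter edge (0, 1) is the min cut.
print(min_cut_irls([(0, 1), (1, 2)], [1.0, 2.0], n=3, s=0, t=2))
```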
Convergence Analysis of Generalized Iteratively Reweighted Least Squares Algorithms on Convex Function Spaces
The computation of robust regression estimates often relies on minimization of a convex functional on a convex set. In this paper we discuss a general technique, closely related to majorization-minimization algorithms, for iteratively computing the minimizers of a large class of convex functionals. Our approach is based on a quadratic approximation of the functional to be minimized and inc...
Journal: CoRR
Volume: abs/1605.07511
Pages: -
Publication date: 2016